Patent abstract:
VEHICLE OPERATING DEVICE. A vehicle operating device 8 is used in an autonomous vehicle C that is autonomously controlled to drive along a determined driving route, and includes a display unit 2 configured to show an occupant of the autonomous vehicle C a current action and a next action to be performed after a lapse of a predetermined time, and an instruction unit 1 configured to instruct the autonomous vehicle C to perform designated actions in accordance with an operation by the occupant.
Publication number: BR112016021450A2
Application number: R112016021450-1
Filing date: 2015-02-04
Publication date: 2021-08-17
Inventor: Tsuyoshi Sakuma
Applicant: Nissan Motor Co. Ltd.
IPC main classification:
Patent description:

[001] The present invention relates to a vehicle operating device for operating an autonomous vehicle.
[002] Autonomous vehicles are known that detect peripheral conditions of the vehicle and autonomously take actions safely, such as lane changes and right/left turns, in order to drive along determined driving routes (for example, see Patent Literature 1). In such an autonomous vehicle, when a new intention arises, such as when the occupant's purpose changes while driving or when the occupant is dissatisfied with a driving plan selected by the system, the occupant must instruct the system to change the route or the plan.
[003] Patent Literature 1: Japanese Unexamined Patent Application Publication No. 2001-301484 Summary of the Invention
[004] The addition of waypoints or the change of routes or plans increases and complicates the operation steps, which prevents the driving action from being changed immediately.
[005] In view of the foregoing, an object of the present invention is to provide a vehicle operating device that allows an occupant to immediately alter an action performed by an autonomous vehicle.
[006] A vehicle operating device is used in an autonomous vehicle that is autonomously controlled to drive along a determined driving route, and includes a display unit and an instruction unit. The display unit shows an occupant of the autonomous vehicle a current action of the autonomous vehicle and a next action of the autonomous vehicle to be performed after a lapse of a predetermined time. The instruction unit designates intended actions, in accordance with an operation by the occupant, from among the actions of the autonomous vehicle shown on the display unit, and instructs the autonomous vehicle to perform the designated actions in accordance with the occupant operation.
[007] [Figure 1] Figure 1 is a block diagram to describe a fundamental configuration of an autonomous vehicle according to an embodiment of the present invention.
[008] [Figure 2] Figure 2 is a view showing an instruction unit and a display unit, to describe a first example of operation of a vehicle operating device included in the autonomous vehicle according to the embodiment of the present invention.
[009] [Figure 3] Figure 3 is a top view of the autonomous vehicle, to schematically describe a second example of operation of the vehicle operating device included in the autonomous vehicle according to the embodiment of the present invention.
[010] [Figure 4] Figure 4 is a view showing the instruction unit and the display unit, to describe the second example of operation of the vehicle operating device included in the autonomous vehicle according to the embodiment of the present invention.
[011] [Figure 5] Figure 5 is a view showing the instruction unit and the display unit, to describe the second example of operation of the vehicle operating device included in the autonomous vehicle according to the embodiment of the present invention.
[012] [Figure 6] Figure 6 is a view showing the instruction unit and the display unit, to describe a third example of operation of the vehicle operating device included in the autonomous vehicle according to the embodiment of the present invention.
[013] [Figure 7] Figure 7 is a top view of the autonomous vehicle, to schematically describe a fourth example of operation of the vehicle operating device included in the autonomous vehicle according to the embodiment of the present invention.
[014] [Figure 8] Figure 8 is a view showing the instruction unit and the display unit, to describe the fourth example of operation of the vehicle operating device included in the autonomous vehicle according to the embodiment of the present invention.
[015] [Figure 9] Figure 9 is a top view of the autonomous vehicle, to schematically describe a fifth example of operation of the vehicle operating device included in the autonomous vehicle according to the embodiment of the present invention.
[016] [Figure 10] Figure 10 is a view showing the instruction unit and the display unit, to describe the fifth example of operation of the vehicle operating device included in the autonomous vehicle according to the embodiment of the present invention.
[017] [Figure 11] Figure 11 is a view showing the instruction unit and the display unit, to describe the fifth example of operation of the vehicle operating device included in the autonomous vehicle according to the embodiment of the present invention.
[018] [Figure 12] Figure 12 is a top view of the autonomous vehicle, to schematically describe the fifth example of operation of the vehicle operating device included in the autonomous vehicle according to the embodiment of the present invention.
[019] [Figure 13] Figure 13(a) to Figure 13(c) are views to describe instruction units included in an autonomous vehicle according to other embodiments of the present invention. DESCRIPTION OF EMBODIMENTS
[020] In the following, embodiments of the present invention will be described with reference to the drawings. The same or similar elements shown in the drawings are indicated by the same or similar reference numerals, and overlapping descriptions are not repeated.
[021] As shown in Figure 1, an autonomous vehicle C according to an embodiment of the present invention includes a driving unit 6 for accelerating and decelerating the autonomous vehicle C, a steering unit 7 for steering the autonomous vehicle C, and a vehicle operating device 8 for controlling the driving unit 6 and the steering unit 7 so as to control the autonomous vehicle C. The autonomous vehicle C is autonomously controlled to drive along a driving route determined by the vehicle operating device 8.
[022] The vehicle operating device 8 includes an instruction unit 1 that instructs the autonomous vehicle C to drive in accordance with an operation performed by the occupant of the autonomous vehicle C, a display unit 2 that provides information to the occupant of the autonomous vehicle C, and a controller 3 that controls the respective components included in the autonomous vehicle C. The vehicle operating device 8 further includes an information acquisition unit 41 that acquires various types of information about autonomous driving, a detection unit 42 that detects peripheral information of the autonomous vehicle C, and a storage unit 5 that stores data necessary for the processing performed by the controller 3.
[023] The instruction unit 1 includes, for example, an input device that receives the operation performed by the occupant and inputs a signal corresponding to the operation to the controller 3. The display unit 2 includes a display device on which images and characters provided for the occupant are displayed, and an output device, such as a speaker, for outputting voice. The display unit 2 shows the occupant a current action of the autonomous vehicle C and a next action to be taken after a lapse of a predetermined time. The instruction unit 1 and the display unit 2 integrally serve as a touch panel display, for example.
[024] The controller 3 includes a route processing unit 31 that implements control processing for a driving route along which the autonomous vehicle C drives, an action processing unit 32 that implements control processing for actions of the autonomous vehicle C, and an action determination unit 33 that determines whether to allow actions performed by the autonomous vehicle C on the driving route. The controller 3 is, for example, a computer including a central processing unit (CPU) to implement the calculation processing necessary for the autonomous vehicle C. The controller 3, the route processing unit 31, and the action processing unit 32 are indicated as elements having logical structures, and may be provided as standalone hardware elements or as an integrated hardware element. The controller 3 controls the autonomous vehicle C to drive along the driving route safely and lawfully, according to the information from the information acquisition unit 41, the detection unit 42, and the storage unit 5.
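Purely as an illustrative sketch (not part of the patent disclosure), the division of responsibilities just described could be pictured as follows in Python; all class and method names are hypothetical assumptions introduced for illustration only.

```python
from dataclasses import dataclass, field

# Illustrative sketch only: hypothetical stand-ins for the logical units of controller 3.

class RouteProcessingUnit:            # route processing unit 31
    def determine_route(self, departure, destination, conditions):
        # Search map data (storage unit 5) for a driving route; placeholder here.
        return [departure, destination]

class ActionProcessingUnit:           # action processing unit 32
    def plan(self, route):
        # Decide the current action and the next action after a predetermined time.
        return {"current": "forward", "next": "right_turn"}

class ActionDeterminationUnit:        # action determination unit 33
    def is_allowed(self, action, peripheral_info, traffic_rules):
        # Judge safety and legality of the action; always permissive in this sketch.
        return True

@dataclass
class Controller:                     # controller 3
    route_unit: RouteProcessingUnit = field(default_factory=RouteProcessingUnit)
    action_unit: ActionProcessingUnit = field(default_factory=ActionProcessingUnit)
    determination_unit: ActionDeterminationUnit = field(default_factory=ActionDeterminationUnit)

    def step(self, departure, destination, peripheral_info, traffic_rules):
        route = self.route_unit.determine_route(departure, destination, conditions=None)
        actions = self.action_unit.plan(route)
        if not self.determination_unit.is_allowed(actions["next"], peripheral_info, traffic_rules):
            actions["next"] = "forward"   # fall back to a permitted action
        return actions
```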
[025] The route processing unit 31 sets a destination of the autonomous vehicle C according to the instruction from the instruction unit 1, and searches for and determines the driving route to the destination from a starting point based on route search conditions including the departure point, the destination, and road information. The route search conditions may also include traffic information regarding the driving route and its periphery, time zones, a road classification, and priority issues relating to the route determination.
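As a concrete, hypothetical illustration of the route search conditions listed above, a data structure along the following lines could gather the inputs evaluated by the route processing unit 31; the field names are assumptions, not terms taken from the patent.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class RouteSearchConditions:
    # Hypothetical grouping of the search conditions named in the description.
    departure_point: str
    destination: str
    road_info: dict = field(default_factory=dict)      # closures, lane counts, etc.
    traffic_info: dict = field(default_factory=dict)   # congestion on the route and its periphery
    time_zone: Optional[str] = None                    # time period of travel
    road_classification: Optional[str] = None          # e.g. expressway vs. ordinary road
    route_priority: str = "time"                       # what the search should optimize

# Example usage: conditions for a search from the current position to a destination.
conditions = RouteSearchConditions(departure_point="current position",
                                   destination="destination set via instruction unit 1")
```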
[026] The action processing unit 32 controls actions performed by the autonomous vehicle C, such as a forward movement, a right turn, a left turn, a lane change, and a stop. The action processing unit 32 shows the occupant, through the display unit 2, the action of the autonomous vehicle C currently being performed on the driving route determined by the route processing unit 31 and the action of the autonomous vehicle C to be performed on that driving route after a lapse of time from the current time.
[027] The action determination unit 33 determines whether to allow each action of the autonomous vehicle C according to the information acquired by the information acquisition unit 41, the information detected by the detection unit 42, and the traffic laws and regulations stored in the storage unit 5.
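The determination described here can be pictured, purely as an assumed sketch, as a rule lookup over the detected and externally acquired conditions; the rule table and condition names below are invented for illustration and do not come from the patent.

```python
# Hypothetical rule table: (action, condition) pairs that make an action impermissible.
PROHIBITING_CONDITIONS = {
    ("lane_change_right", "lane_change_prohibited_boundary"),
    ("right_turn", "no_right_turn_sign"),
    ("stop", "no_stopping_zone"),
}

def is_action_allowed(action, detected_info, acquired_info):
    """Return True unless a detected or externally acquired condition prohibits
    the action under the stored traffic laws and regulations (sketch only)."""
    for condition in set(detected_info) | set(acquired_info):
        if (action, condition) in PROHIBITING_CONDITIONS:
            return False
    return True

# Example: a right turn requested while a 'no right turn' sign is detected is refused.
print(is_action_allowed("right_turn", {"no_right_turn_sign"}, set()))  # False
```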
[028] The information acquisition unit 41 acquires information externally via wireless communication and inputs the information to the controller 3.
[029] The detection unit 42 includes sensors such as a camera, a distance measuring device, and a speedometer. The sensors using electromagnetic waves may detect various frequency bands, such as radio waves, infrared light, and visible light. The detection unit 42 detects peripheral information of the autonomous vehicle C, including other vehicles, obstacles, driving route alignments, road widths, signage, traffic signs, lane boundaries, and road conditions, and inputs the information to the controller 3.
[030] The storage unit 5 includes a storage device, such as a magnetic disk or a semiconductor memory. The storage unit 5 stores programs necessary for the processing implemented by the controller 3, map data, and various types of data, such as traffic laws and regulations. The storage unit 5 may also serve as a transient storage medium for the processing implemented by the controller 3.
[031] The first to fifth examples of operation of the vehicle operating device 8 will be described below, while exemplifying some situations. Each situation is described as a case where the autonomous vehicle C is driving along a predetermined driving route. First example of operation
[032] As shown in Figure 2, the display unit 2 displays at least the arrows A1 to A3 indicating movement in three directions (forward, right, and left) when the autonomous vehicle C is driving along a predetermined driving route, and shows the occupant the indications of a current movement and of a next movement after a lapse of time. The instruction unit 1 composes the touch panel display together with the display unit 2, so that the regions corresponding to the arrows A1 to A3 can be operated by the occupant. When one of the arrows A1 to A3 indicating a direction of movement is operated by the occupant, the instruction unit 1 instructs the autonomous vehicle C to move in the direction indicated by the arrow operated by the occupant. The instruction unit 1 is thus configured in such a way that intended actions can be selected, according to the occupant's operation, from the actions of the autonomous vehicle C shown on the display unit 2.
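A minimal, hypothetical sketch of how operated touch regions could be translated into movement instructions in the spirit of this example follows; the region identifiers and the `instruct` callback are assumptions, not elements of the patented implementation.

```python
# Hypothetical mapping from touch-panel regions to the movement instructions
# issued by the instruction unit 1 when an arrow is operated.
ARROW_REGIONS = {
    "A1": "left_turn",
    "A2": "forward",
    "A3": "right_turn",
}

def on_touch(region_id, instruct):
    """Translate an operated arrow region into an instruction to the vehicle.
    `instruct` stands in for the call into the controller 3 (assumption)."""
    action = ARROW_REGIONS.get(region_id)
    if action is not None:
        instruct(action)

# Example: the occupant touches the region of arrow A3 to request a right turn.
on_touch("A3", instruct=lambda action: print("instructed:", action))
```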
[033] In the example shown in Figure 2, the fully hatched arrow A2 denotes the current action, and the arrow A3 with only its peripheral edge hatched denotes the action to be performed after a lapse of time. That is, the arrow A2 denotes that the current action of the autonomous vehicle C is a forward movement, and the arrow A3 denotes that the next action after a lapse of time is a right turn. The indication of the respective arrows A1 to A3 varies depending on the control by the action processing unit 32, so that the occupant can distinguish the current action of the autonomous vehicle C and the next action to be performed after a lapse of time. Second example of operation
[034] As shown in Figure 3, the autonomous vehicle C shown at position C0 is assumed to be moving in a straight line along the driving route in the left lane of a road having two lanes in each direction divided by a lane boundary, and to be approaching the intersection ahead. When autonomous driving is continued, the autonomous vehicle C continues to move in a straight line to position C00 after a lapse of the predetermined time. The display unit 2 shows only the arrow A2, denoting that the current action is a forward movement. It is assumed that the occupant wants to make a right turn at the intersection ahead.
[035] As shown in Figure 4, the instruction unit 1 instructs the route processing unit 31 of the controller 3 to make a right turn in accordance with the occupant operation performed in the region corresponding to the arrow A3. The display unit 2 notifies the occupant that the instruction unit 1 has been properly operated by changing the indication of the arrow A3 during the operation on the instruction unit 1. The display unit 2 may, for example, use different colors for the arrows A1 to A3 indicating the current action and the action after a lapse of time and for the arrows A1 to A3 that have been operated.
[036] The route processing unit 31 changes the route so as to make a right turn according to the instruction from the instruction unit 1. The autonomous vehicle C starts making a lane change to the adjacent right lane, while turning on a turn signal, according to the control by the controller 3. As shown in Figure 5, the display unit 2 changes the indications of the arrow A2 and the arrow A3 once the lane change is started, and changes the current action indicated from the forward movement to the right turn. The current action shown on the display unit 2 is indicated by the arrow A3 until the right turn is completed, and the indication returns to the arrow A2 denoting the forward movement once the right turn is completed.
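The change of indication described here (the operated action becoming the current indication once it starts, and the display returning to the forward-movement indication when it completes) can be sketched as a small state update; this is an assumed illustration only, not the patented implementation.

```python
# Hypothetical display state for the arrows used in this example.
state = {"current": "A2", "next": "A3"}   # forward movement now, right turn planned next

def on_action_started(state):
    # Once the instructed action starts, it becomes the current indication.
    return {"current": state["next"], "next": None}

def on_action_completed(state):
    # When the right turn is completed, the indication returns to forward movement (A2).
    return {"current": "A2", "next": None}

state = on_action_started(state)    # {'current': 'A3', 'next': None}
state = on_action_completed(state)  # {'current': 'A2', 'next': None}
print(state)
```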
[037] The action determination unit 33 analyzes the safety and legality of the lane change to the right lane according to the peripheral information of the autonomous vehicle C detected by the detection unit 42 and the traffic laws and regulations stored in the storage unit 5. When the action determination unit 33 determines that the lane change to the right lane is possible, the controller 3 controls the driving unit 6 and the steering unit 7 to bring the autonomous vehicle C to position C1 as shown in Figure 3, so as to complete the lane change to the right lane.
[038] The autonomous vehicle C is controlled by the controller 3 to move straight in the right lane and then autonomously stop at position C2 in front of the intersection according to the peripheral information. The autonomous vehicle C then enters the intersection, as indicated by position C3, while autonomously maintaining safe driving, and further enters the crossing road to complete the right turn. When the autonomous vehicle C completes the right turn, the route processing unit 31 searches for and determines the driving route to the destination again. Third example of operation
[039] As shown in Figure 6, other additional marks are displayed on the display unit 2, so that the instruction unit 1 can give further instructions to the autonomous vehicle C. The display unit 2 displays six arrows B1 to B6. The arrows B1 to B6 respectively indicate a lane change to the left lane, a forward movement, a lane change to the right lane, a left turn, a stop, and a right turn. The arrows A1, A2, and A3 shown in Figure 2 correspond to the arrows B4, B2, and B6, respectively.
[040] The display unit 2 displays marks patterned on the lane boundaries between the arrows B1, B2, and B3, which respectively indicate the left-forward direction, the forward direction, and the right-forward direction, so that the occupant can intuitively recognize what the respective arrows B1 to B3 indicate. The display unit 2 displays a mark patterned on a stop line at the head of the arrow B5 pointing in the forward direction, so that the occupant can intuitively recognize what the arrow B5 indicates. The marks displayed on the display unit 2 may have any designs by which the occupant can recognize the meanings of the respective marks, but they should be presented in shapes and colors that the occupant can easily distinguish. The marks of the arrows B1 to B6 displayed on the display unit 2 allow the occupant to easily and intuitively distinguish the respective actions performed by the autonomous vehicle C.
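For clarity, the correspondence between the six marks and the actions they denote, as listed in [039] and [040], can be tabulated; the dictionaries below are only an illustrative restatement of that mapping, not a prescribed data format.

```python
# Mapping of arrows B1 to B6 to the actions they denote, as described above.
B_ARROWS = {
    "B1": "lane change to the left lane",
    "B2": "forward movement",
    "B3": "lane change to the right lane",
    "B4": "left turn",
    "B5": "stop",
    "B6": "right turn",
}

# Correspondence with the three arrows of the first operation example.
A_TO_B = {"A1": "B4", "A2": "B2", "A3": "B6"}
```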
[041] In the example shown in Figure 6, the fully hatched arrow B3 denotes that the current action is a lane change to the adjacent right lane, and the arrow B6 with only its peripheral edge hatched denotes that the next action after a lapse of time is a right turn. The display unit 2 changes the indications of the arrows B1 to B6 depending on the control by the action processing unit 32, so that the occupant can distinguish the current action of the autonomous vehicle C and the next action after a lapse of time. Fourth example of operation
[042] As shown in Figure 7, the autonomous vehicle C shown at position C0 is assumed to be moving in a straight line along the driving route in the left lane of a road having two lanes in each direction, and to be approaching the intersection ahead. When autonomous driving is continued, the autonomous vehicle C continues to move in a straight line to position C00 after a lapse of the predetermined time. The display unit 2 shows only the arrow A2, denoting that the current action is a forward movement. It is assumed that the occupant wants to make a right turn at the intersection ahead. However, the autonomous vehicle C is already entering a section in which the lane boundary indicates that lane changes are prohibited, and a right turn made from position C0 would be an illegal action.
[043] The action determination unit 33 determines that the autonomous vehicle C cannot make a right turn when moving around position C0, according to the information acquired by the information acquisition unit 41, the peripheral information of the autonomous vehicle C detected by the detection unit 42, and the traffic laws and regulations. The display unit 2, for example, changes the indication of the arrow A3 denoting the right turn, as shown in Figure 8, in accordance with the determination by the action determination unit 33, and shows the arrow A3 in a darker color than the arrows A1 and A2. The instruction unit 1 prohibits the operation by the occupant in the region corresponding to the arrow A3 in association with the change of indication made by the display unit 2.
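A hypothetical sketch of how the display indication and the operable touch regions could be kept in step with the determination of the action determination unit 33 is given below; the data shapes and function names are assumptions made for illustration.

```python
# Sketch: dim and disable the marks whose actions are currently not allowed.
def update_marks(marks, is_allowed):
    """marks: dict of mark id -> action name.
    is_allowed: callable returning True if the action may be performed.
    Returns, per mark, whether it is shown dimmed and whether touch is accepted."""
    view = {}
    for mark_id, action in marks.items():
        allowed = is_allowed(action)
        view[mark_id] = {"dimmed": not allowed, "operable": allowed}
    return view

# Example: the right turn (A3) is prohibited in a section where lane changes are banned.
marks = {"A1": "left_turn", "A2": "forward", "A3": "right_turn"}
view = update_marks(marks, is_allowed=lambda a: a != "right_turn")
print(view["A3"])   # {'dimmed': True, 'operable': False}
```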
[044] The display unit 2 changes the indication of a mark whose corresponding action the autonomous vehicle C is prohibited from performing, depending on road information, signage, or traffic signs, so that the mark is indicated differently from the other marks, which allows the occupant to intuitively select another mark for the next action without confusion. Since the instruction unit 1 prohibits the operation by the occupant in association with the change of indication made by the display unit 2, the autonomous vehicle C can be controlled to drive safely and lawfully. Fifth example of operation
[045] As shown in Figure 9, it is assumed that the autonomous vehicle C is moving in the left lane of a road having two lanes in each direction, another vehicle D is moving at a lower speed in front of the autonomous vehicle C, and the controller 3 then selects the action of changing lanes to overtake the vehicle D. As shown in Figure 10, for example, the display unit 2 changes the indication of each of the arrow B1 denoting the lane change to the left lane, the arrow B4 indicating a left turn, and the arrow B6 indicating a right turn, so as to show these arrows in a darker color than the other arrows B2, B3, and B5. The instruction unit 1 prohibits the operation by the occupant in the regions corresponding to the arrows B1, B4, and B6 in association with the change of indication made by the display unit 2.
[046] The display unit 2 informs the occupant that the current action is a forward movement, as indicated by the fully hatched arrow B2, and that the next action after a lapse of time is a lane change to the adjacent right lane, as indicated by the arrow B3 with only its peripheral edge hatched. It is then assumed that the occupant wants to maintain the straight-line movement because safety takes precedence over other considerations.
[047] As shown in Figure 11, the instruction unit 1 instructs the controller 3 to maintain the straight-line movement according to the operation performed by the occupant in the region corresponding to the arrow B2 indicating the forward movement. Thus, the action of overtaking the vehicle D selected by the controller 3 is canceled, so that the autonomous vehicle C can safely maintain the straight-line movement while keeping a sufficient distance from the vehicle D ahead, as shown in Figure 12.
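The cancellation of the system-selected overtaking action in favour of the occupant's forward-movement instruction can be pictured with the following assumed sketch; the function and field names are illustrative and are not taken from the patent.

```python
# Sketch of an occupant override: the occupant's instruction replaces the
# next action planned by the controller, cancelling the overtaking manoeuvre.
def apply_occupant_instruction(plan, occupant_action):
    """plan: {'current': ..., 'next': ...} as planned by the controller 3.
    occupant_action: action designated through the instruction unit 1."""
    if occupant_action is not None:
        plan["next"] = occupant_action   # e.g. keep moving forward behind vehicle D
    return plan

plan = {"current": "forward", "next": "lane_change_right"}   # overtaking selected
print(apply_occupant_instruction(plan, "forward"))           # overtaking cancelled
```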
[048] The vehicle operating device 8 included in the autonomous vehicle C according to the embodiment of the present invention shows the occupant the current action of the autonomous vehicle C and the next action to be performed after a lapse of time, so that the occupant can easily determine whether the occupant's own intention conforms to the action to be performed by the autonomous vehicle C. The occupant can therefore immediately change the action of the autonomous vehicle C when the occupant's intention does not conform to the action selected by the autonomous vehicle C.
[049] According to the vehicle operating device 8, the display unit 2 shows the actions performed in at least three directions, namely the forward direction, the right direction, and the left direction, so that the occupant can easily distinguish the actions performed by the autonomous vehicle C.
[050] According to the vehicle operating device 8, the instruction unit 1 shows arrows indicating the directions in which the autonomous vehicle C moves, so that the occupant can intuitively distinguish the actions of the autonomous vehicle C.
[051] According to the vehicle operating device 8, the display unit 2 changes the indications of the marks denoting the respective actions of the autonomous vehicle C depending on the determination made by the action determination unit 33, so that the occupant can easily recognize which action cannot be performed, which contributes to maintaining safe and lawful autonomous driving.
[052] According to the vehicle operating device 8, the instruction unit 1 prohibits the operation by the occupant according to the determination made by the action determination unit 33, so as to control the autonomous vehicle C to drive safely and lawfully.
[053] According to the vehicle operating device 8, the action determination unit 33 determines whether to allow actions to be performed by the autonomous vehicle C according to road information that the detection unit 42 cannot detect, which contributes to maintaining safe and reliable autonomous driving.
[054] According to the vehicle operating device 8, the display unit 2 indicates the current action of the autonomous vehicle C and the next action to be performed after a lapse of time differently from each other, so that the occupant can easily distinguish the current action and the next action after a lapse of time.
[055] According to the vehicle operating device 8, the occupant can operate the instruction unit 1 so as to directly instruct the autonomous vehicle C to make a lane change to pass another vehicle moving in front of the autonomous vehicle C.
[056] According to the vehicle operating device 8, the occupant can purposely stop the autonomous vehicle C regardless of the setting or selection made by the autonomous vehicle C, so as to make way for other vehicles or pedestrians, or to stop to view scenery.
[057] According to the vehicle operating device 8, the instruction unit 1 and the display unit 2 integrally make up a touch panel display, so as to allow the occupant to recognize the displayed actions of the autonomous vehicle C more intuitively, and to provide the occupant with other information, such as the suitability of each action, more clearly.
[058] While the present invention has been described above with reference to the embodiment, the present invention is not intended to be limited to the descriptions and drawings that form part of this disclosure. Various alternative embodiments, examples, and practical techniques will be apparent to those skilled in the art from this disclosure.
[059] For example, in the above-described embodiment, the instruction unit 1 may be any of various types of input devices, such as dial-type, lever-type, and button-type input devices, as shown in Figure 13(a) to Figure 13(c), to instruct the directions in which to move. Although not shown in the drawings, the display unit 2 may provide voice information through a loudspeaker, and the instruction unit 1 may be a voice input device, such as a microphone, so as to instruct the autonomous vehicle C to make a forward movement or a right/left turn through a voice operation performed by the occupant.
[060] The present invention, of course, includes various embodiments not described herein, such as configurations in which the embodiments described above and the first to fifth examples of operation are mutually applied. Therefore, the scope of the present invention is defined only by the subject matter specified in the claims regarded as appropriate in view of the explanations made above.
[061] The entire contents of Japanese Patent Application No. 2014-054458 (filed March 18, 2014) are incorporated herein by reference.
[062] In accordance with the present invention, a vehicle operating device can be provided that shows the occupant a current action of the autonomous vehicle and a next action performed after a lapse of time, so that the occupant can immediately change the action of the autonomous vehicle when the action selected by the autonomous vehicle does not conform to the occupant's intention.
List of reference signs
A1 to A3, B1 to B6 ARROW
C AUTONOMOUS VEHICLE
1 INSTRUCTION UNIT
2 DISPLAY UNIT
8 VEHICLE OPERATING DEVICE
33 ACTION DETERMINATION UNIT
41 INFORMATION ACQUISITION UNIT
42 DETECTION UNIT
Claims (13)
[1]
1. Vehicle operating device (8) used in an autonomous vehicle (C) autonomously controlled to drive along a determined driving route, the vehicle operating device (8) CHARACTERIZED in that it comprises: a controller that autonomously controls the autonomous vehicle to drive along the determined driving route; a display unit (2) configured to simultaneously show an occupant of the autonomous vehicle (C) a current action of the autonomous vehicle (C) controlled by the controller, a next controller-controlled action to be performed after a lapse of a predetermined time, and an action that the occupant of the autonomous vehicle (C) can select; and an instruction unit (1) configured to designate intended actions, in accordance with an occupant operation, from among the actions of the autonomous vehicle (C) shown on the display unit (2), and to instruct the autonomous vehicle (C) to perform the designated actions in accordance with the occupant operation, wherein the display unit (2) indicates the current action of the autonomous vehicle (C) controlled by the controller, the next controller-controlled action to be performed after the predetermined time lapse, and the action that the occupant of the autonomous vehicle (C) can select, with designs different from one another, and wherein the display unit (2) shows the action selected by the occupant as the current action when the action selected by the occupant is initiated.
[2]
2. Vehicle operating device (8) used in an autonomous vehicle (C) autonomously controlled to drive along a determined driving route, the vehicle operating device (8) CHARACTERIZED in that it comprises:
a controller that autonomously controls the autonomous vehicle to drive along the determined driving route; a display unit (2) configured to simultaneously show, via an arrow indicating a forward direction, to an occupant of the autonomous vehicle (C), a current action of the autonomous vehicle (C) controlled by the controller, a next controller-controlled action to be performed after a lapse of a predetermined time, and an action that the occupant of the autonomous vehicle (C) can select; and an instruction unit (1) configured to designate intended actions, in accordance with an occupant operation, from among the actions of the autonomous vehicle (C) shown on the display unit (2), and to instruct the autonomous vehicle (C) to perform the designated actions in accordance with the occupant operation, wherein the display unit (2) shows the occupant-selected action as the current action when the occupant-selected action is initiated.
[3]
3. Vehicle operating device (8) used in an autonomous vehicle (C) autonomously controlled to drive along a determined driving route, the vehicle operating device (8) CHARACTERIZED in that it comprises: a controller that autonomously controls the autonomous vehicle to drive along the determined driving route; a display unit (2) configured to simultaneously show, through three arrows (A1, A2, A3) indicating a forward direction, to an occupant of the autonomous vehicle (C), a current action of the autonomous vehicle (C) controlled by the controller, a next controller-controlled action to be performed after a lapse of a predetermined time, and an action that the occupant of the autonomous vehicle (C) can select; and an instruction unit (1) configured to designate intended actions, in accordance with an occupant operation, from among the actions of the autonomous vehicle (C) shown on the display unit (2), and to instruct the autonomous vehicle (C) to perform the designated actions in accordance with the occupant operation, wherein the display unit (2) shows the occupant-selected action as the current action when the occupant-selected action is initiated.
[4]
4. Vehicle operating device (8), according to any one of claims 1 to 3, CHARACTERIZED by the fact that the display unit (2) shows the current action of the autonomous vehicle (C) controlled by the controller, the next controller-controlled action to be performed after the predetermined time lapse, and the action that the occupant of the autonomous vehicle (C) can select, by arranging these actions on the display unit (2).
[5]
5. Vehicle operating device (8), according to any one of claims 1 to 4, CHARACTERIZED by the fact that the display unit (2) indicates the directions in which the autonomous vehicle (C) moves for the current action and the next action to be performed after the predetermined time lapse, selected from at least three directions including a forward direction, a right direction, and a left direction.
[6]
6. Vehicle operating device (8), according to any one of claims 1 to 5, CHARACTERIZED by the fact that, when the occupant operates arrows (A1, A2, A3, B1, B2, B3, B4, B5, B6) indicating the directions in which the autonomous vehicle (C) moves for the actions performed by the autonomous vehicle (C), the instruction unit (1) instructs the autonomous vehicle (C) to move in the directions indicated by the arrows (A1, A2, A3, B1, B2, B3, B4, B5, B6) operated by the occupant.
[7]
7. Vehicle operating device (8), according to any one of claims 1 to 6, CHARACTERIZED by the fact that it further comprises:
a detection unit (42) configured to detect peripheral information of the autonomous vehicle (C); and an action determination unit (33) configured to determine whether to allow each action performed by the autonomous vehicle (C) in accordance with the information detected by the detection unit (42) and traffic laws and regulations, wherein the display unit (2) changes an indication of a mark that denotes each action performed by the autonomous vehicle (C) depending on the determination by the action determination unit (33).
[8]
8. Vehicle operating device (8), according to claim 7, CHARACTERIZED by the fact that the instruction unit (1) prohibits the occupant's operation depending on the determination by the action determination unit (33).
[9]
9. Vehicle operating device (8), according to claim 7 or 8, CHARACTERIZED in that it further comprises an information acquisition unit (41) configured to externally acquire road information about the driving route via wireless communication, wherein the action determination unit (33) determines whether to allow each action performed by the autonomous vehicle (C) according to the road information acquired by the information acquisition unit (41), the information detected by the detection unit (42), and traffic laws and regulations.
[10]
10. Vehicle operating device (8), according to any one of claims 1 to 9, CHARACTERIZED by the fact that, when the current controller-controlled action and the next controller-controlled action to be performed after the lapse of the predetermined time are different from each other, the display unit (2) provides the occupant with marks denoting the current controller-controlled action and the next controller-controlled action to be performed after the predetermined time lapse, indicated differently from each other.
[11]
11. Vehicle operating device (8), according to any one of claims 1 to 10, CHARACTERIZED by the fact that the instruction unit (1) instructs the autonomous vehicle (C) to make a lane change to an adjacent lane in accordance with the occupant operation when the autonomous vehicle (C) drives on a road with divided lanes.
[12]
12. Vehicle operating device (8), according to any one of claims 1 to 11, CHARACTERIZED by the fact that the instruction unit (1) instructs the autonomous vehicle (C) to make a stop in accordance with the operation of the occupant.
[13]
13. Vehicle operating device (8), according to any one of claims 1 to 12, CHARACTERIZED by the fact that the instruction unit (1) and the display unit (2) serve integrally as a touch panel display to show the marks that denote the actions performed by the autonomous vehicle (C), so that the regions corresponding to the marks are operated by the occupant.